A BFGS-SQP method for nonsmooth, nonconvex, constrained optimization and its evaluation using relative minimization profiles
Authors
Abstract
We propose an algorithm for solving nonsmooth, nonconvex, constrained optimization problems as well as a new set of visualization tools for comparing the performance of optimization algorithms. Our algorithm is a sequential quadratic optimization method that employs BFGS quasi-Newton Hessian approximations and an exact penalty function whose parameter is controlled using a steering strategy. We empirically validate our method using our new visualization tools, which we call relative minimization profiles. Such profiles are designed to simultaneously assess the relative performance of several algorithms with respect to three measures, highlighting the trade-offs between the measures when comparing algorithm performance on a heterogeneous test set. For example, in our experiments, we employ our algorithm to solve challenging test problems in controller design involving both locally Lipschitz and non-locally-Lipschitz functions, using relative minimization profiles to simultaneously compare our method against others in terms of objective quality, feasibility error, and speed of progress.
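The abstract describes an exact penalty function whose parameter is controlled by a steering strategy. The sketch below (not the authors' implementation; the function names and the scaling φ(x; μ) = μ·f(x) + v(x) are illustrative assumptions) shows the basic ingredients: a constraint-violation measure, the penalty function itself, and a steering-style rule that decreases μ when a step makes insufficient predicted progress toward feasibility.

```python
def violation(c_vals):
    """Total violation of inequality constraints c_i(x) <= 0."""
    return sum(max(c, 0.0) for c in c_vals)

def exact_penalty(f_val, c_vals, mu):
    """One common exact penalty: phi(x; mu) = mu * f(x) + v(x).
    Scaling the objective (rather than the violation) by mu is an
    assumption here; formulations in the literature differ."""
    return mu * f_val + violation(c_vals)

def steer_penalty(mu, sufficient_progress, shrink=0.5):
    """Steering-style update: cut mu when the computed search direction
    does not make sufficient predicted progress toward feasibility,
    so that feasibility is prioritized over objective decrease."""
    return mu * shrink if not sufficient_progress else mu
```

In a full method the SQP subproblem would be re-solved after each decrease of μ until the steering test passes; here a single geometric cut stands in for that loop.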
Similar resources
Robust and efficient methods for approximation and optimization of stability measures
We consider two new algorithms with practical application to the problem of designing controllers for linear dynamical systems with input and output: a new spectral value set based algorithm called hybrid expansion-contraction intended for approximating the H∞ norm, or equivalently, the complex stability radius, of large-scale systems, and a new BFGS SQP based optimization method for nonsmooth,...
A feasible second order bundle algorithm for nonsmooth, nonconvex optimization problems with inequality constraints: I. Derivation and convergence
This paper extends the SQP-approach of the well-known bundle-Newton method for nonsmooth unconstrained minimization to the nonlinearly constrained case. Instead of using a penalty function or a filter or an improvement function to deal with the presence of constraints, the search direction is determined by solving a convex quadratically constrained quadratic program to obtain good iteration poi...
A Sequential Quadratic Programming Algorithm for Nonconvex, Nonsmooth Constrained Optimization
We consider optimization problems with objective and constraint functions that may be nonconvex and nonsmooth. Problems of this type arise in important applications, many having solutions at points of nondifferentiability of the problem functions. We present a line search algorithm for situations when the objective and constraint functions are locally Lipschitz and continuously differentiable o...
Convergence of the BFGS Method for LC¹ Convex Constrained Optimization
This paper proposes a BFGS-SQP method for linearly constrained optimization where the objective function f is only required to have a Lipschitz gradient. The KKT system of the problem is equivalent to a system of nonsmooth equations F(v) = 0. At every step a quasi-Newton matrix is updated if ‖F(v_k)‖ satisfies a rule. This method converges globally and the rate of convergence is superlinear whe...
An efficient one-layer recurrent neural network for solving a class of nonsmooth optimization problems
Constrained optimization problems have a wide range of applications in science, economics, and engineering. In this paper, a neural network model is proposed to solve a class of nonsmooth constrained optimization problems with a nonsmooth convex objective function subject to nonlinear inequality and affine equality constraints. It is a one-layer non-penalty recurrent neural network based on the...
Journal:
- Optimization Methods and Software
Volume 32, Issue -
Pages -
Publication year: 2017